14 research outputs found

    Moving out from the focus: Exploring gaze interaction design in games

    Eye trackers have become an affordable and compelling input device for game interaction targeting the PC gaming community. The number of games adopting gaze input for in-game interaction has rapidly increased over the years, with examples in mainstream game franchises. However, games have focused on integrating gaze input on top of fully functional games, utilising gaze as a pointing device and a tool for efficiency; e.g. faster selection of the game objects the player looks at to improve their performance. We deem this limiting because the use of gaze is obvious, it does not harness the full potential and richness of the eyes, and it only considers that players look at game elements to interact with them. Accordingly, this thesis investigates new opportunities for gaze in games by exploring gaze concepts that challenge the interaction metaphor "what you look at is what you get" and proposes adopting "not looking" gaze interactions that reflect what we can do with our eyes. Three playful concepts stem from this principle: (1) playing with tension; (2) playing with peripheral vision; and (3) playing without looking. We operationalise each concept with game prototypes that pose different challenges based on visual attention, perception in the wider visual field, and the ability to move the eyes with the eyelids closed. These demonstrate that ideas tested playfully can lead to useful solutions. Finally, we look across our work to distil guidelines for designing with "not looking" interactions, the use of dramatisation to support the integration of gaze interaction in the game, and the exploration of interactive experiences only possible when taking input from the eyes. We aim to inspire the future of gaze-enabled games with new directions by proposing that there is more to the eyes than where players look.

    Exploration of smooth pursuit eye movements for gaze calibration in games

    Eye tracking offers opportunities to extend novel interfaces and promises new ways of interaction for gameplay. However, gaze has been found challenging to use in dynamic interfaces involving motion. Moving targets are hard to select with state-of-the-art gaze input methods, and gaze estimation requires calibration in order to be accurate enough to offer a successful interaction experience. Smooth pursuit eye movements have been used to address these challenges, but there is not enough information on the behavior of the eyes when performing such movements. In this work, we tried to understand the relationship between gaze and motion when performing smooth pursuit movements through the integration of calibration within a videogame. In our first study, we propose to leverage the attentive gaze behavior of the eyes during gameplay for implicit and continuous re-calibration. We demonstrated this with GazeBall, a retro-inspired version of Atari's BreakOut game in which we continually calibrate gaze based on the ball's movement and the player's inevitable ocular pursuit of the ball. Continuous calibration enabled the extension of the game with a gaze-based `power-up'. In the evaluation of GazeBall, we show that our approach is effective in maintaining highly accurate gaze input over time, while re-calibration remains invisible to the player. GazeBall raised awareness of the lack of information about smooth pursuit for interaction. Therefore, in our second study, we focused on gaining a better understanding of the behavior of the eyes. By testing different motion directions and speeds, we found anticipatory movements during the gaze trajectory, indicating that when a moving target is present the eyes do not merely follow it but try to predict and anticipate the displayed movement.
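
    As a rough illustration of the re-calibration loop described above, the Python sketch below matches a window of gaze samples against the ball's trajectory and, when the correlation confirms pursuit, folds the mean residual into a drift correction. The offset-only drift model, thresholds, and names are assumptions for illustration, not the authors' implementation.

        import numpy as np

        PURSUIT_THRESHOLD = 0.9  # min gaze/ball correlation to accept a window as pursuit
        WINDOW = 60              # samples per window (~1 s at 60 Hz, assumed rate)

        def pursuit_correlation(gaze, target):
            """Pearson correlation of gaze vs. target paths, averaged over x and y."""
            rx = np.corrcoef(gaze[:, 0], target[:, 0])[0, 1]
            ry = np.corrcoef(gaze[:, 1], target[:, 1])[0, 1]
            return (rx + ry) / 2.0

        class DriftCorrector:
            def __init__(self):
                self.offset = np.zeros(2)  # current correction, in screen pixels
                self.gaze_buf, self.ball_buf = [], []

            def update(self, raw_gaze, ball_pos):
                """Feed one frame; returns the drift-corrected gaze point."""
                self.gaze_buf.append(raw_gaze)
                self.ball_buf.append(ball_pos)
                if len(self.gaze_buf) >= WINDOW:
                    g, b = np.array(self.gaze_buf), np.array(self.ball_buf)
                    if pursuit_correlation(g, b) > PURSUIT_THRESHOLD:
                        # Player is pursuing the ball, so the mean residual is drift.
                        self.offset = 0.8 * self.offset + 0.2 * (b - g).mean(axis=0)
                    self.gaze_buf.clear()
                    self.ball_buf.clear()
                return raw_gaze + self.offset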

    Socially distanced games: Exploring the future opportunities of remote play

    Playing games with friends and family provided a way to stay connected and deal with isolation during the COVID-19 pandemic. However, restrictions on co-located events affected how both regular and casual players scheduled, organised, participated in, and engaged with various games. Through an online survey, we aimed to gain preliminary insights into how the swift switch from physical to remote play, forced by the circumstances, impacted gameplay experiences and how different players changed their playing habits. Our preliminary results suggest that computer-mediated communication systems successfully allowed co-located game sessions to be translated to remote settings, but also highlight the emergence of different points of player friction during remote game experiences, e.g., the tediousness of scheduling and setup, miscommunication, or concerns for playmates' wellbeing. We discuss future research and design opportunities that explore the potential to augment social game experiences at a distance and debate the future of remote or hybrid play.

    KryptonEyed: Playing with Gaze Without Looking

    As eye-tracking technologies become more affordable, the number of mainstream gaze-enabled games increases. These allow triggering in-game actions when the eyes focus on objects and locations of interest. Such gaze interactions follow the interaction paradigm "what you look at is what you get". We challenge this use of gaze interaction and propose to play without looking, with the eyes closed. We designed the game prototype KryptonEyed to introduce closing the eyes for eyes-only game control. Players are required to close their eyes and perform eye movements behind the eyelids before opening them to aim the teleportation of the main character. The game contains three levels integrating the proposed gaze mechanic in distinct game scenarios. These explore different challenges in their game dynamics and interaction metaphors to apply the technique in various contexts of play.
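
    A minimal sketch of the "aim with closed eyes, commit on opening" mechanic, assuming a sensor that keeps reporting relative eye movement while the eyelids are shut; the sample format, threshold, and class name are hypothetical, not taken from the paper.

        import math

        OPEN_THRESHOLD = 0.3  # eyelid openness below this value counts as closed

        class ClosedEyeAimer:
            def __init__(self):
                self.aim = [0.0, 0.0]  # movement accumulated while the eyes are closed
                self.closed = False

            def on_sample(self, openness, eye_dx, eye_dy):
                """Feed one sample: eyelid openness in [0, 1] plus eye-movement deltas."""
                if openness < OPEN_THRESHOLD:
                    self.closed = True
                    self.aim[0] += eye_dx  # integrate eye movement behind the eyelids
                    self.aim[1] += eye_dy
                    return None
                if self.closed:            # eyes just reopened: commit the aim
                    self.closed = False
                    angle = math.atan2(self.aim[1], self.aim[0])
                    self.aim = [0.0, 0.0]
                    return angle           # direction in which to teleport the character
                return None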

    Gaze+Hold: Eyes-only Direct Manipulation with Continuous Gaze Modulated by Closure of One Eye

    The eyes are coupled in their gaze function and are therefore usually treated as a single input channel, limiting the range of interactions. However, people are able to open and close one eye while still gazing with the other. We introduce Gaze+Hold as an eyes-only technique that builds on this ability to leverage the eyes as separate input channels, with one eye modulating the state of interaction while the other provides continuous input. Gaze+Hold enables direct manipulation beyond pointing, which we explore through the design of Gaze+Hold techniques for a range of user interface tasks. In a user study, we evaluated performance, usability and users' spontaneous choice of eye for modulation of input. The results show that users are effective with Gaze+Hold. The choice of dominant versus non-dominant eye had no effect on performance, perceived usability or workload. This is significant for the utility of Gaze+Hold as it affords flexibility for mapping either eye in different configurations.
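
    The sketch below shows one way such a technique could be wired up as a drag interaction: exactly one closed eye acts as the "hold", while the open eye's gaze point moves the grabbed object. The per-eye openness signal and the scene API (pick_object, move_object) are assumptions, not taken from the paper.

        OPEN_THRESHOLD = 0.3  # eyelid openness below this value counts as closed

        class GazeHoldDrag:
            def __init__(self, scene):
                self.scene = scene
                self.held = None

            def on_sample(self, left_open, right_open, gaze_xy):
                """Feed per-eye openness and the gaze point from the open eye."""
                left_closed = left_open < OPEN_THRESHOLD
                right_closed = right_open < OPEN_THRESHOLD
                holding = left_closed != right_closed  # exactly one eye is closed
                if holding and self.held is None:
                    self.held = self.scene.pick_object(gaze_xy)  # grab under gaze
                elif holding and self.held is not None:
                    self.scene.move_object(self.held, gaze_xy)   # drag with the open eye
                elif not holding:
                    self.held = None                             # both eyes open: release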

    Smooth-i: smart re-calibration using smooth pursuit eye movements

    Eye gaze for interaction depends on calibration. However, gaze calibration can deteriorate over time, affecting the usability of the system. We propose to use motion matching between smooth pursuit eye movements and known motion on the display to determine when there is a drift in accuracy, and to use this as input for re-calibration. To explore this idea we developed Smooth-i, an algorithm that stores calibration points and updates them incrementally when inaccuracies are identified. To validate the accuracy of Smooth-i, we conducted a study with five participants and a remote eye tracker. A baseline calibration profile was used by all participants to test the accuracy of the Smooth-i re-calibration following interaction with moving targets. Results show that Smooth-i is able to manage re-calibration efficiently, updating the calibration profile only when inaccurate data samples are detected.
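
    A minimal sketch of the Smooth-i idea under an assumed offset-based correction model: stored calibration points are nudged only when motion matching confirms the user followed a moving target and the residual indicates drift. Window handling, thresholds, and the update rate are illustrative, not the paper's exact algorithm.

        import numpy as np

        MATCH_THRESHOLD = 0.9  # correlation required to confirm the user followed the target
        DRIFT_PX = 40          # mean residual (px) above which local re-calibration triggers
        ALPHA = 0.3            # incremental update rate for stored corrections

        class SmoothI:
            def __init__(self, calib_points):
                # calib_points: list of (screen_xy, correction_xy) pairs from calibration
                self.points = np.array([p for p, _ in calib_points], dtype=float)
                self.corrections = np.array([c for _, c in calib_points], dtype=float)

            def correct(self, gaze):
                """Apply the correction stored at the nearest calibration point."""
                i = np.argmin(np.linalg.norm(self.points - gaze, axis=1))
                return gaze + self.corrections[i]

            def on_window(self, gaze_traj, target_traj):
                """Motion-match one window of samples; update corrections on drift."""
                g, t = np.asarray(gaze_traj), np.asarray(target_traj)
                rho = np.mean([np.corrcoef(g[:, k], t[:, k])[0, 1] for k in (0, 1)])
                if rho < MATCH_THRESHOLD:
                    return                       # user was not following: no evidence
                residual = (t - g).mean(axis=0)  # systematic offset = drift estimate
                if np.linalg.norm(residual) > DRIFT_PX:
                    i = np.argmin(np.linalg.norm(self.points - t.mean(axis=0), axis=1))
                    self.corrections[i] = (1 - ALPHA) * self.corrections[i] + ALPHA * residual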

    Gaze Behaviour on Interacted Objects during Hand Interaction in Virtual Reality for Eye Tracking Calibration

    In this paper, we investigate the probability and timing of attaining gaze fixations on interacted objects during hand interaction in virtual reality, with the main purpose of enabling implicit and continuous eye tracking re-calibration. We conducted an evaluation with 15 participants in which their gaze was recorded while they interacted with virtual objects. The data was analysed to find factors influencing the probability of fixations at different phases of interaction for different object types. The results indicate that 1) interacting with stationary objects may be more favourable for attaining fixations than interacting with moving objects, 2) prolonged and precision-demanding interactions positively influence the probability of attaining fixations, 3) performing multiple interactions simultaneously can negatively impact the probability of fixations, and 4) feedback can initiate and end fixations on objects.
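
    One plausible way to turn such fixations into re-calibration input is to harvest gaze samples whenever gaze falls near an interacted object, as in the sketch below; the 3D direction-vector representation, angular threshold, and per-frame hook are assumptions for illustration.

        import numpy as np

        FIXATION_DEG = 3.0  # accept samples within this angular error of the object

        def angle_deg(a, b):
            """Angle in degrees between two 3D direction vectors."""
            a, b = a / np.linalg.norm(a), b / np.linalg.norm(b)
            return np.degrees(np.arccos(np.clip(np.dot(a, b), -1.0, 1.0)))

        calibration_samples = []

        def on_interaction_frame(gaze_dir, object_dir, is_grabbing):
            """Call once per frame while the hand interacts with an object."""
            if is_grabbing and angle_deg(gaze_dir, object_dir) < FIXATION_DEG:
                # The user is plausibly fixating the interacted object: treat the
                # object's true direction as ground truth for re-calibration.
                calibration_samples.append((gaze_dir, object_dir))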

    Towards Designing Diegetic Gaze in Games: The Use of Gaze Roles and Metaphors

    Gaze-based interactions have found their way into the games domain and are frequently employed as a means to support players in their activities. Instead of implementing gaze as an additional game feature via a game-centred approach, we propose a diegetic perspective by introducing gaze interaction roles and gaze metaphors. Gaze interaction roles represent ambiguous mechanics in gaze, whereas gaze metaphors serve as narrative figures that symbolise, illustrate, and are applied to the interaction dynamics. Within this work, the current literature in the field is analysed to find examples that design around gaze mechanics and follow a diegetic approach that takes roles and metaphors into account. A list of surveyed gaze metaphors related to each gaze role is presented and described in detail. Furthermore, a case study shows the potential of the proposed approach. Our work aims to contribute to existing frameworks, such as EyePlay, by reflecting on the ambiguous meaning of gaze in games. Through this integrative approach, players are anticipated to develop a deeper connection to the game narrative via gaze, resulting in a stronger experience of presence (i.e., being in the game world).

    Put a Label On It! Approaches for Constructing and Contextualizing Bar Chart Physicalizations

    Physicalizations represent data through their tangible and material properties. In contrast to screen-based visualizations, there is currently very limited understanding of how to label or annotate physicalizations to support people in interpreting the data they encode. Because of their spatiality, contextualization through labeling or annotation is crucial to communicate data across different orientations. In this paper, we study labeling approaches as part of the overall construction process of bar chart physicalizations. We designed a toolkit of physical tokens and paper data labels and asked 16 participants to construct and contextualize their own data physicalizations. We found that (i) the construction and contextualization of physicalizations is a highly intertwined process, (ii) data labels are integrated with physical constructs in the final design, and (iii) both are influenced by orientation changes. We contribute an understanding of the role of data labeling in the creation and contextualization of physicalizations.

    Behavioral Analysis of Smooth Pursuit Eye Movements for Interaction

    Gaze has been found challenging to use in dynamic interfaces involving motion. Moving targets are hard to select with state-of-the-art gaze input methods, and gaze estimation requires calibration in order to be accurate enough to offer a successful experience. Smooth Pursuit eye movements broaden opportunities to extend novel interfaces and promise new ways of interaction. However, there is not enough information on the natural behavior of the eyes when performing them. In this work, we tried to understand the relationship between Smooth Pursuits and motion, focusing on movement speed and direction. Results show anticipatory movements when performing pursuits, indicating a natural tendency of the eyes to predict the displayed movement. These results could help in the design of interfaces and algorithms that use Smooth Pursuit for interaction.
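
    One way to quantify such anticipatory behavior is to cross-correlate shifted copies of the gaze and target traces and report the best-fitting lag; in the sketch below (sampling rate, window, and function name are assumptions), a negative lag means the eyes lead, i.e. anticipate, the target.

        import numpy as np

        def pursuit_lag(gaze, target, rate_hz, max_lag_s=0.5):
            """Return the lag (s) that best aligns 1D gaze and target traces."""
            g = gaze - gaze.mean()      # remove offsets so correlation reflects motion
            t = target - target.mean()
            max_lag = int(max_lag_s * rate_hz)

            def corr_at(lag):
                # Correlate gaze shifted by `lag` samples against the target.
                if lag >= 0:
                    a, b = g[lag:], t[:len(t) - lag]
                else:
                    a, b = g[:lag], t[-lag:]
                return np.corrcoef(a, b)[0, 1]

            best = max(range(-max_lag, max_lag + 1), key=corr_at)
            return best / rate_hz  # negative: gaze leads (anticipates) the target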